Object detection model with TensorFlow.js, deployed on a Raspberry Pi.
Flow
- The user owns or downloads a machine learning model in TensorFlow.js format.
- The user creates a Node-RED node for the TensorFlow.js model and wires it into a Node-RED application.
- The user can deploy the Node-RED application locally.
- The user can access the Node-RED application from a browser and can trigger inferencing on images captured from a webcam.
- Alternatively, the user can deploy the Node-RED application to a Raspberry Pi.
- The device runs the Node-RED application and performs inferencing on images from a camera.
- The device can output to a connected speaker or take some other action depending on the inference results.
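The last step in the flow above amounts to filtering the model's detections by a confidence score before acting on them. A minimal sketch of that logic in plain JavaScript (the `filterDetections` helper and the detection shape shown, modeled on COCO-SSD-style output, are illustrative assumptions, not code from the repo):

```javascript
// Keep only the classes the model is reasonably confident about.
// Each detection is assumed to look like COCO-SSD output:
//   { class: "person", score: 0.97, bbox: [x, y, w, h] }
function filterDetections(detections, minScore) {
  return detections
    .filter((d) => d.score >= minScore)
    .map((d) => d.class);
}

// Simulated model output for one camera frame.
const detections = [
  { class: "person", score: 0.92, bbox: [10, 20, 100, 200] },
  { class: "dog", score: 0.41, bbox: [50, 60, 80, 90] },
];
console.log(filterDetections(detections, 0.5)); // -> [ 'person' ]
```

A downstream node could then play a sound or toggle a GPIO pin only when a class of interest (for example `"person"`) survives the threshold.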
Configuration
- Git
  sudo apt update
  sudo apt install git
- Node.js and nvm
  curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.35.3/install.sh | bash
  nvm install node
- Node packages
  npm install /home/pi/app/node-red-tensorflowjs/node-red-contrib-tfjs-object-detection
  npm install node-red-contrib-browser-utils node-red-contrib-play-audio node-red-contrib-image-output
  npm install node-red-node-pi-gpio
- Raspberry Pi camera
  npm install node-red-contrib-camerapi
- USB webcam
  npm install node-red-contrib-usbcamera
  sudo apt install fswebcam
- Code
  git clone https://github.com/IBM/node-red-tensorflowjs
  cd node-red-tensorflowjs
  npm start
Hardware Setup
- Open the Camera Port on the Raspberry Pi
- Insert the Camera Cable
- Test the connection with the command:
raspistill -o Desktop/image.jpg
- Connect the sensor to pin 8 (or any other GPIO pin)
- Power the sensor from the Raspberry Pi's 5V pin
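The motion sensor on the GPIO pin produces a digital high/low signal, and in software a motion event is just a rising edge in that signal. A minimal sketch of the edge-detection logic in plain JavaScript (the `risingEdges` function is illustrative; in the actual flow, node-red-node-pi-gpio reads the pin for you, so samples are fed in directly here to keep the sketch runnable anywhere):

```javascript
// Detect rising edges (motion starting) in a stream of GPIO samples.
// 0 = pin low (no motion), 1 = pin high (motion detected).
function risingEdges(samples) {
  const triggers = [];
  let last = 0;
  for (let i = 0; i < samples.length; i++) {
    if (samples[i] === 1 && last === 0) {
      triggers.push(i); // motion started at sample i
    }
    last = samples[i];
  }
  return triggers;
}

// Two separate motion events in this sample stream.
console.log(risingEdges([0, 0, 1, 1, 0, 1, 0])); // -> [ 2, 5 ]
```

Treating only the rising edge as the trigger is what keeps a person standing still in front of the sensor from firing the camera repeatedly.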
Import required Flows
- Make sure Node-RED is running
- Open a browser and go to your Node-RED Editor
- Click on the Node-RED Menu
- Click on Import
- Select the Clipboard tab
- Click on select a file to import
- Browse to and select one of the flow files in the cloned repo
  - If trying things out locally in your browser, use browser-flow.json.
  - If using a Raspberry Pi with peripherals, use raspberrypi-flows.json.
- Select Import to new flow
- Click Import
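For orientation, a Node-RED flow file is a JSON array of node objects wired together by id. The fragment below is a heavily trimmed, illustrative sketch of that structure; the ids, names, and exact node types are invented, not copied from the repo's flow files:

```json
[
  { "id": "inject1", "type": "inject", "name": "Take Photo", "wires": [["camera1"]] },
  { "id": "camera1", "type": "camerapi-takephoto", "wires": [["detect1"]] },
  { "id": "detect1", "type": "tfjs-object-detection", "wires": [["debug1"]] },
  { "id": "debug1", "type": "debug", "name": "results", "wires": [] }
]
```

Each node's `wires` array names the nodes its output feeds, which is what the editor renders as the connecting lines after import.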
Deploy and run on a Raspberry Pi
- Double-click the Play Audio File exec node and change the path in the append section to the path of a .wav file of your choosing. Click Done when finished.
- Click the Deploy button.
- Trigger the camera:
  - You can manually trigger a snapshot by clicking the Take Photo inject node.
  - If using the motion sensor flow, motion near the sensor will trigger the camera.
License
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so according to the MIT License